50 - Recap Clip 8.12: Regression and Classification with Linear Models (Part 2) [ID:30453]

The next thing we did was look at this idea.

In our example, we tried to learn a model for house prices with a single argument, say square feet; that was the univariate case.

But often we have multiple factors, so what we really need is multivariate regression.

And the upshot is that it works exactly like the univariate case, only you always have a vector instead of a single number. You have n + 1 numbers instead of two, which gives you vectors, and you are going to use dot products.

But apart from that and a couple of tricks, nothing much happens.
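The step from two numbers to a weight vector and a dot product can be sketched as follows (a hypothetical NumPy illustration, not code from the lecture; the trick of prepending a constant 1 to the feature vector absorbs the intercept into the dot product):

```python
import numpy as np

# Univariate hypothesis: two numbers, w0 (intercept) and w1 (slope).
def h_uni(w0, w1, x):
    return w0 + w1 * x

# Multivariate hypothesis: n + 1 weights, one dot product.
# Prepending a constant 1 to the feature vector lets the
# intercept w[0] live inside the same dot product.
def h_multi(w, x):
    return np.dot(w, np.concatenate(([1.0], x)))

# With a single feature, both formulations agree.
w0, w1, x = 2.0, 3.0, 5.0
assert np.isclose(h_uni(w0, w1, x),
                  h_multi(np.array([w0, w1]), np.array([x])))
```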

In particular, you can still have analytic solutions, only they become expressions of vector calculus.
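As an illustration of such an analytic solution (a sketch with made-up house data, assuming the standard least-squares setup; the lecture does not show code): minimizing the squared error leads to the normal equation w = (XᵀX)⁻¹Xᵀy, which NumPy evaluates stably via a least-squares solver.

```python
import numpy as np

# Design matrix with a leading column of ones for the intercept;
# the remaining columns are features (e.g. square feet, rooms).
X = np.array([[1.0, 1000.0, 2.0],
              [1.0, 1500.0, 3.0],
              [1.0, 2000.0, 3.0],
              [1.0, 2500.0, 4.0]])
y = np.array([200.0, 290.0, 370.0, 460.0])  # prices (invented numbers)

# Analytic least-squares solution: w = (X^T X)^{-1} X^T y.
# lstsq is the numerically stable way to evaluate this expression.
w, *_ = np.linalg.lstsq(X, y, rcond=None)

predictions = X @ w  # here the data is exactly linear, so these match y
```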

And you have the added problem that you normally want to regularize, because if you have multiple dimensions, some dimensions might be irrelevant. There you can actually get cases of overfitting, where you are learning a concrete parameter, or a range of parameters, in dimensions that, if you look closely, are actually irrelevant.

And if you have irrelevant dimensions, then you really want to get rid of them.

And the way to do that is with regularization.

That is done by introducing into the functional we are minimizing some kind of complexity term that we minimize jointly with the error, and which prefers simpler solutions.

And by Occam's razor, those are the better ones.
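One common concrete instance of such a complexity term (a sketch assuming ridge/L2 regularization on synthetic data; the lecture does not commit to a specific penalty) is to minimize ||Xw − y||² + λ||w||², which again has an analytic solution, w = (XᵀX + λI)⁻¹Xᵀy:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two relevant features plus three irrelevant noise dimensions.
n, d = 30, 5
X = rng.normal(size=(n, d))
true_w = np.array([3.0, -2.0, 0.0, 0.0, 0.0])  # last three dims irrelevant
y = X @ true_w + 0.1 * rng.normal(size=n)

def ridge(X, y, lam):
    """Minimize ||Xw - y||^2 + lam * ||w||^2 analytically:
    w = (X^T X + lam * I)^{-1} X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

w_plain = ridge(X, y, 0.0)   # ordinary least squares
w_reg = ridge(X, y, 10.0)    # penalized: prefers smaller weights

# The complexity term shrinks the weight vector toward zero,
# damping whatever the model learned in the irrelevant dimensions.
assert np.linalg.norm(w_reg) < np.linalg.norm(w_plain)
```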

Part of a chapter: Recaps
Accessible via: Open access
Duration: 00:02:30 min
Recording date: 2021-03-30
Uploaded on: 2021-03-31 11:08:08
Language: en-US

Recap: Regression and Classification with Linear Models (Part 2)

Main video on the topic in chapter 8 clip 12.
